Meta Ends Third-party Fact-Checking, Adds 'Community Notes' System
2025-01-23
Facebook parent company Meta recently announced changes to the way it tries to identify misinformation and harmful material published on its social media services.

Meta chief Mark Zuckerberg explained in a video that the company had decided to make the changes because the old system had produced "too many mistakes and too much censorship."

Issues with Meta moderation

Zuckerberg said the moderation system Meta had built needed to be "complex" to examine huge amounts of content in search of material that violated company policies. However, he noted that such systems can make a lot of errors. The Meta chief added, "Even if they accidentally censor just one percent of posts, that's millions of people."

So, he said, the company had decided to move to a new system centered on "reducing mistakes, simplifying our policies, and restoring free expression."

"Community Notes" system

The new method turns over content moderation duties to a "Community Notes" system. The company said this system aims to "empower the community" to decide whether content is acceptable or needs further examination.

The changes will apply to Meta's Facebook, Instagram and Threads services. Meta said the new system would become available first to U.S. users in the coming months.

Meta's former moderation system involved independent, third-party fact-checking organizations. Many of these were large media companies or news agencies. The efforts included digital tools as well as human workers to fact-check content and identify false, inappropriate or harmful material.

Meta said the third-party method ended up identifying too much material for fact-checking. After closer examination, the company said, a lot of that content should have been considered "legitimate political speech and debate." Another problem, the company said, was that decisions made by content moderators could be affected by their personal beliefs, opinions and biases. One result was that "a program intended to inform too often became a tool to censor."

Meta's new Community Notes system is similar to the method used by the social media service X. A statement by Meta said changes under this system will have to be made by users, not anyone from the company. Meta said, "Just like they do on X, Community Notes will require agreement between people with a range of perspectives to help prevent biased ratings." The company also invited users to register to be among the first to try out the system.

How did fact-checkers react to Meta's change?

The International Fact-Checking Network (IFCN) criticized Meta's decision. It said the move threatened to "undo nearly a decade of progress." The group rejected Zuckerberg's claim that the fact-checking program had become a "tool to censor" users. It noted that "the freedom to say why something is not true is also free speech."

Milijana Rogač is executive editor of the Serbian fact-checking outlet Istinomer. She told the Reuters news agency that she thinks Meta's decision will end up hurting the media industry. Rogač noted that research suggests many citizens use Meta services as their main source of information. Removing independent fact-checkers "further hinders access to accurate information and news," she said.

How effective are Community Notes?

Not a lot of research has been done on how effective Community Notes systems are. But one 2024 effort carried out by the University of California and Johns Hopkins University found that community notes entered on X for COVID-19 misinformation were accurate. The research showed the notes used both moderate and high-quality sources and were attached to widely read posts. However, the number of people taking part in that study was small. Also, the effects the system had on users' opinions and behavior are unknown.

A 2023 study in the Journal of Online Trust and Safety found it was harder for users to agree when they examined content related to political issues.

I'm Bryan Lynn.
Bryan Lynn wrote this story, based on reports from Meta, The Associated Press and Reuters.

_____________________________________________________

Words in This Story

censorship - n. the system or practice of censoring information contained in books, movies, the internet, etc.

moderate - v. to make sure the rules of an internet discussion are not broken

content - n. writing, audio and visual material found online

inappropriate - adj. not right or suitable in a particular situation

legitimate - adj. real or genuine

bias - n. a situation in which you support or oppose something in an unfair way because you are influenced by personal opinions

range - n. all elements of a series of objects or numbers

perspective - n. the way a person thinks about something

decade - n. a period of 10 years

hinder - v. to make it difficult to do something

accurate - adj. true or correct